Global Linear Convergence of Evolution Strategies on More than Smooth Strongly Convex Functions

Abstract

Evolution strategies (ESs) are zeroth-order stochastic black-box optimization heuristics invariant to monotonic transformations of the objective function. They evolve a multivariate normal distribution, from which candidate solutions are generated. Among different variants, CMA-ES is nowadays recognized as one of the state-of-the-art optimizers for difficult problems. Despite ample empirical evidence that ESs with a step-size control mechanism converge linearly, theoretical guarantees of linear convergence have been established only on limited classes of functions. In particular, results on convex functions are missing, where zeroth-order and also first-order methods are often analyzed. In this paper, we establish almost sure linear convergence and a bound on the expected hitting time for an ES family, namely the $(1+1)_\kappa$-ES, which includes the (1+1)-ES with the (generalized) one-fifth success rule and an abstract covariance matrix adaptation with bounded condition number, on a broad class of functions. The analysis holds for monotonic transformations of positively homogeneous functions and of quadratically bounded functions, the latter of which particularly includes monotonic transformations of strongly convex functions with Lipschitz continuous gradient. To the best of the authors' knowledge, this is the first work that proves linear convergence of ESs on such a broad class of functions.
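To make the algorithm family concrete, the following is a minimal sketch of a (1+1)-ES with the one-fifth success rule. It is not the paper's $(1+1)_\kappa$-ES: the covariance matrix adaptation is omitted (the search distribution stays isotropic), and the adaptation factor `alpha` is a common textbook choice, not a value from the paper.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, max_iters=2000, seed=0):
    """Sketch of a (1+1)-ES with the one-fifth success rule.

    The paper's (1+1)_kappa-ES additionally adapts a covariance matrix
    with bounded condition number; here the sampling is isotropic for
    brevity, and alpha is an illustrative choice.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    sigma = sigma0
    alpha = 1.5  # alpha on success, alpha**(-1/4) on failure:
                 # the step size is stationary at a 1/5 success rate
    for _ in range(max_iters):
        y = x + sigma * rng.standard_normal(x.shape)
        fy = f(y)
        if fy <= fx:          # success: accept and enlarge the step size
            x, fx = y, fy
            sigma *= alpha
        else:                 # failure: shrink the step size
            sigma *= alpha ** -0.25
    return x, fx
```

On the sphere function `f(z) = z @ z`, the per-iteration multiplicative progress this scheme achieves is what "linear convergence" refers to in the abstract.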


Related articles

Optimization of Smooth and Strongly Convex Functions

A. Proof of Lemma 1 We need the following lemma, which characterizes a property of extra-gradient descent. Lemma 8 (Lemma 3.1 in (Nemirovski, 2005)). Let Z be a convex compact set in Euclidean space E with inner product 〈·, ·〉, let ‖ · ‖ be a norm on E and ‖ · ‖∗ its dual norm, and let ω(z) : Z ↦ R be an α-strongly convex function with respect to ‖ · ‖. The Bregman distance associated wi...


On the quadratic support of strongly convex functions

In this paper, we first introduce the notion of $c$-affine functions for $c > 0$. Then we deal with some properties of strongly convex functions in real inner product spaces by using a quadratic support function at each point which is $c$-affine. Moreover, a Hyers–Ulam stability result for strongly convex functions is shown.


Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions

Linear convergence rates of descent methods for unconstrained minimization are usually proven under the assumption that the objective function is strongly convex. Recently it was shown that the weaker assumption of restricted strong convexity suffices for linear convergence of the ordinary gradient descent method. A decisive difference from strong convexity is that the set of minimizers of a rest...
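The linear rate referred to above can be seen on a toy strongly convex quadratic (a hypothetical illustration, not taken from the article; the matrix, step size, and starting point are chosen for the example):

```python
import numpy as np

# Gradient descent on f(x) = 0.5 * x^T A x with mu = 1, L = 10.
# With step size 2/(mu + L), the error norm contracts by the fixed
# factor (L - mu)/(L + mu) = 9/11 at every step: linear convergence.
A = np.diag([1.0, 10.0])
step = 2.0 / (1.0 + 10.0)
x = np.array([1.0, 1.0])
errs = []
for _ in range(50):
    x = x - step * (A @ x)          # gradient of f is A @ x
    errs.append(np.linalg.norm(x))
ratios = [errs[i + 1] / errs[i] for i in range(len(errs) - 1)]
```

Each entry of `ratios` equals 9/11 up to floating-point error, i.e. the contraction factor is constant along the run.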


On Strongly h-Convex Functions

We introduce the notion of strongly h-convex functions (defined on a normed space) and present some properties and representations of such functions. We obtain a characterization of inner product spaces involving the notion of strongly h-convex functions. Finally, a Hermite–Hadamard-type inequality for strongly h-convex functions is given.

متن کامل



Journal

Journal: SIAM Journal on Optimization

Year: 2022

ISSN: 1095-7189, 1052-6234

DOI: https://doi.org/10.1137/20m1373815